Search for: All records

Creators/Authors contains: "Zuckerman, David"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. One of the earliest models of weak randomness is the Chor–Goldreich (CG) source. A (t, n, k)-CG source is a sequence of random variables X = (X_1, …, X_t) ∼ ({0,1}^n)^t, where each X_i has min-entropy k conditioned on any fixing of X_1, …, X_{i−1}. Chor and Goldreich proved that there is no deterministic way to extract randomness from such a source. Nevertheless, Doron, Moshkovitz, Oh, and Zuckerman showed that there is a deterministic way to condense a CG source into a string with small entropy gap. They gave applications of such a condenser to simulating randomized algorithms with small error and to certain cryptographic tasks. They studied the case where the block length n and entropy rate k/n are both constant. We study the much more general setting where the block length can be arbitrarily large and the entropy rate can be arbitrarily small. We construct the first explicit condenser for CG sources in this setting, and it can be instantiated in a number of different ways. When the entropy rate of the CG source is constant, our condenser requires just a constant number of blocks t to produce an output with entropy rate 0.9, say. In the low-entropy regime, using t = poly(n) blocks, our condenser can achieve output entropy rate 0.9 even if each block has just 1 bit of min-entropy. Moreover, these condensers have exponentially small error. Finally, we provide strong existential and impossibility results. For our existential result, we show that a random function is a seedless condenser (with surprisingly strong parameters) for any small family of sources. As a corollary, we get new existential results for seeded condensers and condensers for CG sources. For our impossibility result, we show that the latter result is nearly tight, by giving a simple proof that the output of any condenser for CG sources must inherit the entropy gap of (one block of) its input.
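A toy illustration of the model above (not the paper's construction; the adversary strategy and all parameters here are invented for the sketch): one simple way to realize a (t, n, k)-CG source is to let k bits of each block be fresh uniform bits and let an adversary fix the remaining n − k bits as a function of the history, which guarantees min-entropy k per block conditioned on any fixing of the previous blocks.

```python
import hashlib
import secrets

def sample_cg_source(t, n, k):
    """Sample one draw from a toy (t, n, k)-CG source.

    Each n-bit block consists of k fresh uniform bits plus n - k bits
    fixed adversarially as a function of the previous blocks, so each
    block has min-entropy k conditioned on any fixing of the history.
    Assumes n - k <= 256 (one SHA-256 digest of "adversarial" bits).
    """
    blocks = []
    for _ in range(t):
        fresh = [secrets.randbits(1) for _ in range(k)]
        # Invented adversary: derive the fixed bits from the history.
        digest = hashlib.sha256("".join(blocks).encode()).digest()
        fixed = [(digest[i // 8] >> (i % 8)) & 1 for i in range(n - k)]
        blocks.append("".join(str(b) for b in fresh) +
                      "".join(str(b) for b in fixed))
    return blocks

print(sample_cg_source(t=4, n=16, k=2))
```

A condenser for such a source must map the concatenated blocks to m bits whose min-entropy is m − g for a small gap g; per the impossibility result in the abstract, g must inherit at least roughly the entropy gap of a single block.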
  2. A Chor–Goldreich (CG) source is a sequence of random variables X = X_1 ∘ … ∘ X_t, where each X_i ∼ {0,1}^d and X_i has min-entropy δd conditioned on any fixing of X_1 ∘ … ∘ X_{i−1}. The parameter 0 < δ ≤ 1 is the entropy rate of the source. We typically think of d as constant and t as growing. We extend this notion in several ways, defining almost CG sources. Most notably, we allow each X_i to have only conditional Shannon entropy δd. We achieve pseudorandomness results for almost CG sources that were not known to hold even for standard CG sources, or even for the weaker model of Santha–Vazirani sources: We construct a deterministic condenser that, on input X, outputs a distribution which is close to having constant entropy gap, namely a distribution Z ∼ {0,1}^m for m ≈ δdt with min-entropy m − O(1). Therefore, we can simulate any randomized algorithm with small failure probability using almost CG sources with no multiplicative slowdown. This result extends to randomized protocols as well, and to any setting in which we cannot simply cycle over all seeds and a "one-shot" simulation is needed. Moreover, our construction works in an online manner, since it is based on random walks on expanders. Our main technical contribution is a novel analysis of random walks, which should be of independent interest. We analyze walks with adversarially correlated steps, each step being entropy-deficient, on good enough lossless expanders. We prove that such walks (or certain interleaved walks on two expanders), starting from a fixed vertex and walking according to X_1 ∘ … ∘ X_t, accumulate most of the entropy in X.
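A minimal sketch of the walk template described above, with heavy caveats: the real construction walks on good lossless expanders (or interleaves walks on two expanders), while the `neighbor` map below is an invented stand-in, so this only illustrates the data flow of an online, walk-based condenser: start at a fixed vertex and take one step per source block.

```python
import random

M = 12  # the walk lives on 2**M vertices
D = 8   # each step consumes one D-bit source block (degree 2**D)

def neighbor(v, label):
    # Invented stand-in for a lossless expander's neighbor function:
    # for each fixed label this affine map is a permutation of the
    # vertex set (5 is odd, hence invertible mod 2**M).
    return (5 * v + 2 * label + 1) % (1 << M)

def condense_by_walk(blocks):
    """Walk from a fixed start vertex, one step per source block.

    The output is the final vertex, an M-bit string. In the paper's
    analysis, such walks on good enough lossless expanders accumulate
    most of the entropy of X even with adversarially correlated,
    entropy-deficient steps.
    """
    v = 0
    for x in blocks:
        v = neighbor(v, x)
    return format(v, f"0{M}b")

blocks = [random.randrange(1 << D) for _ in range(20)]
print(condense_by_walk(blocks))
```

The walk is inherently online: each block is consumed once, in order, with no need to revisit earlier blocks, which is where the "no multiplicative slowdown" simulation comes from.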
  3. We construct explicit deterministic extractors for polynomial images of varieties, that is, distributions sampled by applying a low-degree polynomial map f to an element sampled uniformly at random from a k-dimensional variety V. This class of sources generalizes both polynomial sources, studied by Dvir, Gabizon and Wigderson (FOCS 2007, Comput. Complex. 2009), and variety sources, studied by Dvir (CCC 2009, Comput. Complex. 2012). Assuming certain natural non-degeneracy conditions on the map f and the variety V, which in particular ensure that the source has enough min-entropy, we extract almost all the min-entropy of the distribution. Unlike the Dvir–Gabizon–Wigderson and Dvir results, our construction works over large enough finite fields of arbitrary characteristic. One key part of our construction is an improved deterministic rank extractor for varieties. As a by-product, we obtain explicit Noether normalization lemmas for affine varieties and affine algebras. Additionally, we generalize a construction of affine extractors with exponentially small error due to Bourgain, Dvir and Leeman (Comput. Complex. 2016) by extending it to all finite prime fields of quasipolynomial size.
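To make the source model concrete (this samples the distribution being extracted from; it is not the extractor, and the map below is an arbitrary invented example): a polynomial image of a variety is f(u) for a low-degree map f applied to a uniform point u of a k-dimensional variety V. The simplest case takes V to be all of F_p^k.

```python
import random

P = 101  # a small prime; the actual result needs large enough fields

def f(u1, u2):
    """An invented low-degree polynomial map f: F_P^2 -> F_P^3."""
    return ((u1 * u1 + u2) % P,
            (u1 * u2 + 3) % P,
            (u2 ** 3 + u1) % P)

def sample_polynomial_source():
    # Uniform point of the simplest 2-dimensional variety, F_P^2 itself,
    # pushed forward through the degree-3 map f.
    return f(random.randrange(P), random.randrange(P))

print([sample_polynomial_source() for _ in range(3)])
```

The non-degeneracy conditions in the abstract rule out maps like a constant f, whose image would carry no min-entropy at all.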
  4. Existing proofs that deduce BPP = P from circuit lower bounds convert randomized algorithms into deterministic algorithms with a large polynomial slowdown. We convert randomized algorithms into deterministic ones with little slowdown. Specifically, assuming exponential lower bounds against randomized NP ∩ coNP circuits, formally known as randomized SVN circuits, we convert any randomized algorithm over inputs of length n running in time t ≥ n into a deterministic one running in time t^{2+α} for an arbitrarily small constant α > 0. Such a slowdown is nearly optimal for t close to n, since under standard complexity-theoretic assumptions, there are problems with an inherent quadratic derandomization slowdown. We also convert any randomized algorithm that errs rarely into a deterministic algorithm having a similar running time (with pre-processing). The latter derandomization result holds under weaker assumptions, of exponential lower bounds against deterministic SVN circuits. Our results follow from a new, nearly optimal, explicit pseudorandom generator fooling circuits of size s with seed length (1+α) log s, under the assumption that there exists a function f ∈ E that requires randomized SVN circuits of size at least 2^{(1−α′)n}, where α′ = O(α). The construction uses, among other ideas, a new connection between pseudoentropy generators and locally list-recoverable codes.
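The arithmetic behind the t^{2+α} bound is worth spelling out: a generator with seed length (1+α) log s has 2^{(1+α) log s} = s^{1+α} seeds, so deterministically cycling over all of them and taking a majority vote costs about t^{1+α} runs of a time-t algorithm, i.e., time roughly t^{2+α}. A schematic sketch of seed enumeration (the PRG and algorithm below are placeholders, not the paper's construction):

```python
def derandomize(randomized_alg, prg, seed_len, x):
    """Deterministic simulation by majority vote over all PRG seeds.

    With seed length (1 + alpha) * log2(s), the loop makes s**(1 + alpha)
    passes, each running the algorithm once; for s comparable to the
    running time t, that is roughly t**(2 + alpha) total work.
    """
    votes = 0
    for seed in range(1 << seed_len):
        r = prg(seed)  # pseudorandom bits for this seed
        votes += 1 if randomized_alg(x, r) else -1
    return votes > 0

# Placeholder PRG and algorithm, only so the sketch runs end to end.
def toy_prg(seed):
    return [(seed >> i) & 1 for i in range(64)]

def toy_alg(x, r):
    return (x + sum(r)) % 2 == 0

print(derandomize(toy_alg, toy_prg, seed_len=10, x=7))
```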
  5. We explicitly construct an extractor for two independent sources on n bits, each with min-entropy at least log^C n for a large enough constant C. Our extractor outputs one bit and has error n^{−Ω(1)}. The best previous extractor, by Bourgain, required each source to have min-entropy 0.499n. A key ingredient in our construction is an explicit construction of a monotone, almost-balanced Boolean function on n bits that is resilient to coalitions of size n^{1−δ} for any δ > 0. In fact, our construction is stronger in that it gives an explicit extractor for a generalization of non-oblivious bit-fixing sources on n bits, where some unknown n − q bits are chosen almost polylog(n)-wise independently, and the remaining q = n^{1−δ} bits are chosen by an adversary as an arbitrary function of the n − q bits. The best previous construction, by Viola, achieved q = n^{1/2−δ}. Our explicit two-source extractor directly implies an explicit construction of a 2^{(log log N)^{O(1)}}-Ramsey graph over N vertices, improving bounds obtained by Barak et al. and matching an independent work by Cohen.
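For intuition about resilience (majority is the textbook example, far weaker than the paper's function): a coalition of q bits shifts majority by about q/√n standard deviations, so majority only resists coalitions of size o(√n), while the construction above resists q = n^{1−δ}. The sketch below estimates how often q colluding bits can force majority to 1; all parameters are illustrative.

```python
import random

def coalition_bias(n, q, trials=2000):
    """Estimate how often a coalition of q bits forces majority to 1.

    The n - q honest bits are uniform; the adversary's best move for
    pushing majority toward 1 is to set all q of its bits to 1. At
    q near sqrt(n) the forcing probability is already far from 1/2,
    which is the barrier the paper's resilient function breaks.
    """
    forced = 0
    for _ in range(trials):
        honest = bin(random.getrandbits(n - q)).count("1")
        if honest + q > n // 2:
            forced += 1
    return forced / trials

print(coalition_bias(n=1001, q=31))  # q ~ sqrt(n): noticeably above 1/2
```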